WAFFLe: Weight Anonymized Factorization for Federated Learning
Authors
Abstract
In domains where data are sensitive or private, there is great value in methods that can learn in a distributed manner without the data ever leaving the local devices. In light of this need, federated learning has emerged as a popular training paradigm. However, many federated learning approaches trade transmitting data for communicating updated weight parameters from each local device. Therefore, a successful breach that would have otherwise directly compromised the data instead grants whitebox access to the local model, which opens the door to a number of attacks, including exposing the very data federated learning seeks to protect. Additionally, in such scenarios, individual client devices commonly exhibit high statistical heterogeneity. Many common federated approaches learn a single global model; while it may do well on average, its performance degrades when the i.i.d. assumption is violated, underfitting individuals further from the mean and raising questions of fairness. To address these issues, we propose Weight Anonymized Factorization for Federated Learning (WAFFLe), an approach that combines the Indian Buffet Process with a shared dictionary of weight factors for neural networks. Experiments on MNIST, FashionMNIST, and CIFAR-10 demonstrate WAFFLe's significant improvement to local test performance and fairness while simultaneously providing an extra layer of security.
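The core mechanism the abstract describes, where each client's network weights are composed from a client-specific binary selection over a shared dictionary of weight factors, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: all names, shapes, and the hand-picked binary selections (which the paper would instead draw via an Indian Buffet Process prior) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a shared dictionary of K weight factors for a
# single d_in x d_out layer, common to all clients.
K, d_in, d_out = 8, 4, 3
dictionary = rng.normal(size=(K, d_in, d_out))

def client_weights(selection: np.ndarray) -> np.ndarray:
    """Compose one client's layer weights from its selected factors.

    `selection` is a binary vector with one entry per dictionary
    factor; summing the selected factors yields the client's weights.
    """
    return np.tensordot(selection.astype(float), dictionary, axes=1)

# Two clients with different sparse selections share some factors
# (personalization) while only factor usage, not raw weights, would
# need to be communicated.
sel_a = np.array([1, 1, 0, 0, 1, 0, 0, 0])
sel_b = np.array([1, 0, 1, 0, 0, 0, 1, 0])
W_a = client_weights(sel_a)
W_b = client_weights(sel_b)
print(W_a.shape)  # (4, 3)
```

Because each client's model is a different sparse combination of shared factors, heterogeneous clients get personalized weights, which connects to the fairness motivation in the abstract.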
Similar Resources
Federated Meta-Learning for Recommendation
Recommender systems have been widely studied from the machine learning perspective, where it is crucial to share information among users while preserving user privacy. In this work, we present a federated meta-learning framework for recommendation in which user information is shared at the level of algorithm, instead of model or data adopted in previous approaches. In this framework, user-speci...
Learning Anonymized Representations with Adversarial Neural Networks
Statistical methods protecting sensitive information or the identity of the data owner have become critical to ensure privacy of individuals as well as of organizations. This paper investigates anonymization methods based on representation learning and deep neural networks, and motivated by novel information-theoretical bounds. We introduce a novel training objective for simultaneously training ...
Federated Multi-Task Learning
Federated learning poses new statistical and systems challenges in training machine learning models over distributed networks of devices. In this work, we show that multi-task learning is naturally suited to handle the statistical challenges of this setting, and propose a novel systems-aware optimization method, MOCHA, that is robust to practical systems issues. Our method and theor...
Blue waffle?
"Blue waffle," or "blue waffle disease," is many things, but real is not one of them. It is an urban legend, a myth, a tall tale, a rumor, a hoax, etc. about a fictional sexually transmitted infection [2] (STI). If you do an image search, you'll find (fake) pictures of blue waffle on the Internet. The blue refers to one of the alleged symptoms, which Reader #2 pointed out, and waffle is slang f...
Federated Learning: Strategies for Improving Communication Efficiency
Federated Learning is a machine learning setting where the goal is to train a high-quality centralized model with training data distributed over a large number of clients, each with unreliable and relatively slow network connections. We consider learning algorithms for this setting where on each round, each client independently computes an update to the current model based on its local data, and ...
Journal
Journal title: IEEE Access
Year: 2022
ISSN: 2169-3536
DOI: https://doi.org/10.1109/access.2022.3172945